Add cases indexing controls and Apostrophe sitemap/robots setup #270
IhorMasechko merged 4 commits into main
Conversation
Walkthrough
Adds sitemap and robots support and injects SEO metadata for case studies.
🚥 Pre-merge checks: ✅ Passed checks (3 passed)
🔍 Vulnerabilities of the built image

| digest | sha256:b2e2f93e2d481621218d5d378de7107f2381820b75ab8ef963f637b299d408b8 |
| vulnerabilities | |
| platform | linux/amd64 |
| size | 173 MB |
| packages | 973 |

📦 Base Image: node:24-alpine
Actionable comments posted: 3
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@website/modules/case-studies-page/views/show.html`:
- Around line 4-11: The template's hasQueryParams check is too narrow and misses
arbitrary query keys (e.g., utm_source); update the logic in show.html that
defines hasQueryParams so it detects any present query parameters beyond
expected ones by inspecting the keys of data.query (e.g., via Object.keys or
equivalent) and treating unknown keys as indicators to set noindex;
specifically, replace the fixed OR-list (data.query.search, industry, stack,
caseStudyType, partner, page) with a check that returns true when data.query
contains any key not in the allowed whitelist (search, industry, stack,
caseStudyType, partner, page) so detail pages with extra params will render
noindex.
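The fix can be sketched as a small helper (a sketch only: the helper name `shouldNoindex` and the standalone-function form are assumptions, since the real check lives inline in `show.html`):

```javascript
// Query keys the case-studies listing UI is known to use (from the review comment).
const KNOWN_FILTER_KEYS = ['search', 'industry', 'stack', 'caseStudyType', 'partner', 'page'];

// Decide whether the rendered page is a query variant that should be noindex'd.
function shouldNoindex(query = {}) {
  const keys = Object.keys(query);
  // A known filter key means a filtered/paginated variant...
  const hasKnown = keys.some((key) => KNOWN_FILTER_KEYS.includes(key));
  // ...and an unknown key (e.g. utm_source) also marks a duplicate variant.
  const hasUnknown = keys.some((key) => !KNOWN_FILTER_KEYS.includes(key));
  return hasKnown || hasUnknown;
}
```

The explicit known/unknown split mirrors the review's wording; note it reduces to `keys.length > 0`, i.e. the presence of any query parameter at all triggers noindex, which is the behavior the reviewer asks for.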
In `@website/modules/robots/index.js`:
- Around line 17-19: Remove the two Disallow rules that block query variants of
the cases pages from the robots string (the lines containing 'Disallow:
/cases?*' and 'Disallow: /cases/*?*' in the robots generation/template) so
crawlers can fetch those pages and see your page-level noindex,follow and
canonical tags; leave the rest of the robots content (including Sitemap:
${baseUrl}/sitemap.xml) intact.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 0edd6f27-29c7-42ec-b249-25b862cdc423
⛔ Files ignored due to path filters (1)
`website/package-lock.json` is excluded by `!**/package-lock.json`
📒 Files selected for processing (7)
- website/app.js
- website/modules/@apostrophecms/sitemap/index.js
- website/modules/case-studies-page/index.js
- website/modules/case-studies-page/views/index.html
- website/modules/case-studies-page/views/show.html
- website/modules/robots/index.js
- website/package.json
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@website/modules/robots/index.js`:
- Around line 15-17: The production-host check in robots handler uses baseHost
=== 'www.speedandfunction.com' which will misidentify the real site when
BASE_URL defaults to 'https://speedandfunction.com'; update the check in the
robots route (variables: baseHost, isProductionHost, isProduction,
baseUrl/BASE_URL) to allow the canonical host by parsing the configured baseUrl
hostname (e.g., new URL(baseUrl).hostname) and compare against both
'speedandfunction.com' and 'www.speedandfunction.com' (or simply compare to the
parsed hostname), so that when isProduction is true and the parsed hostname
matches the real production hostname the handler serves normal robots instead of
Disallow: /.
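The suggested host check, as a sketch (the name `isProductionHost` comes from the comment's variable list; wrapping it as a standalone function is an assumption):

```javascript
// Hostnames that should be treated as the real production site,
// covering both the bare and www variants of the canonical host.
const PRODUCTION_HOSTNAMES = ['speedandfunction.com', 'www.speedandfunction.com'];

function isProductionHost(baseUrl) {
  try {
    // Parse the configured BASE_URL and compare its hostname.
    return PRODUCTION_HOSTNAMES.includes(new URL(baseUrl).hostname);
  } catch {
    // Unparseable or empty baseUrl: treat as non-production, i.e. serve Disallow: /.
    return false;
  }
}
```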
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 41ac8dbd-b0c0-4252-9fab-02eb9962a1c0
📒 Files selected for processing (3)
- website/modules/case-studies-page/services/UrlService.js
- website/modules/case-studies-page/views/show.html
- website/modules/robots/index.js
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@website/modules/robots/index.js`:
- Line 6: The baseUrl variable (const baseUrl = self.apos.baseUrl || '') can
include a trailing slash, causing double slashes when concatenating
'/sitemap.xml'; normalize baseUrl by removing any trailing slashes (e.g., trim
trailing '/' characters from baseUrl) before it’s used to build the sitemap URL
so concatenation like baseUrl + '/sitemap.xml' always yields a single slash.
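The normalization can be sketched in one line (the helper name `sitemapUrl` is an assumption; in the module the trimming would happen where `baseUrl` is defined):

```javascript
// Trim any trailing slashes from baseUrl so concatenating '/sitemap.xml'
// always yields exactly one slash between host and path.
function sitemapUrl(baseUrl) {
  return baseUrl.replace(/\/+$/, '') + '/sitemap.xml';
}
```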
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 13586c27-952b-48dc-a559-7d70b4cf2fca
📒 Files selected for processing (1)
website/modules/robots/index.js
@IhorMasechko i'm begging you to fix whatever made #270 (comment) so it doesn't at-mention people
Summary
- Add indexing controls for `/cases` listing pages to prevent indexing of filter/search/pagination URL combinations.
- Integrate `@apostrophecms/sitemap` and configure robots handling for dev/prod behavior, including sitemap publication.

What changed
- Registered the sitemap module in `website/app.js` and configured it in `website/modules/@apostrophecms/sitemap/index.js`.
- Added SEO metadata (`index`/`noindex` + canonical) in:
  - `website/modules/case-studies-page/index.js`
  - `website/modules/case-studies-page/views/index.html`
  - `website/modules/case-studies-page/views/show.html`
- Added a `robots.txt` route in `website/modules/robots/index.js`: `Disallow: /` in non-production, disallow `/cases` query combinations, and publish the sitemap URL from `baseUrl`.
- `website/package.json`: added `@apostrophecms/sitemap@^1.2.0`.

Why

Validation
- `http://localhost:3000/sitemap.xml` returns sitemap entries for public pages and case detail pages.
- `robots.txt` in local/dev mode returns `Disallow: /`.

Notes
- `baseUrl` being set to `https://www.speedandfunction.com`.

Integrates Apostrophe sitemap and adds SEO controls for case studies: canonical links and robots meta tags on listing and detail pages, marking query-parameter variants (filters, searches, pagination) as noindex while keeping primary pages indexable. Adds a dynamic, environment-aware `/robots.txt` (disallow in non-prod; allow in prod with the sitemap URL) and adds the `@apostrophecms/sitemap` dependency.
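Putting the pieces together, the environment-aware robots behavior described above might look roughly like this. This is a sketch under assumptions: `buildRobotsTxt` is a hypothetical pure helper, not the actual contents of `website/modules/robots/index.js`, and it already follows the review's advice to drop the `/cases` query `Disallow` rules in favor of page-level noindex tags.

```javascript
// Build the robots.txt body for the current environment.
function buildRobotsTxt({ isProduction, baseUrl }) {
  if (!isProduction) {
    // Dev/staging hosts: block all crawlers.
    return 'User-agent: *\nDisallow: /\n';
  }
  // Production: allow crawling and advertise the sitemap. Query variants of
  // /cases are handled by page-level noindex + canonical tags rather than
  // robots Disallow rules, so crawlers can actually fetch and see those tags.
  const root = baseUrl.replace(/\/+$/, '');
  return ['User-agent: *', 'Allow: /', `Sitemap: ${root}/sitemap.xml`, ''].join('\n');
}
```

In the real module this string would be served from the `robots.txt` route, with `isProduction` and `baseUrl` derived from the environment and `self.apos.baseUrl`.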